Generative A.I. and the New Medical Generalist


In the journal Nature today, my colleagues and I published an article on the future directions of generative A.I. (also known as Large Language Models or Foundation Models) for the practice of medicine. These new A.I. models have opened up a multitude of exciting opportunities in healthcare that we didn't have before, along with many challenges and liabilities. I'll briefly explain how we got here and what's in store. Back in 2017, Google researchers published a paper ("Attention Is All You Need") describing a new model architecture, dubbed the Transformer, that weighs different parts of its input differently through a self-attention mechanism and trains faster thanks to parallelism, ultimately replacing recurrent and convolutional deep neural networks (RNNs and CNNs, respectively). Foreshadowing generative A.I., they concluded: "We plan to extend the Transformer to problems involving input and output modalities other than text and to investigate local, restricted attention mechanisms to efficiently handle large inputs and outputs such as images, audio and video."
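To make the attention idea concrete, here is a minimal NumPy sketch of the scaled dot-product attention at the heart of the Transformer paper. This is illustrative only: real Transformers add learned projection matrices, multiple attention heads, and layer stacking, none of which are shown here.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, the core operation
    from "Attention Is All You Need"."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise query-key similarity
    # Numerically stable softmax: each row of weights sums to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output is a weighted average of the values

# Toy self-attention over 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V
print(out.shape)  # (3, 4)
```

Because every token attends to every other token in a single matrix multiplication, the whole sequence can be processed in parallel, which is what lets Transformers train faster than the step-by-step recurrence of RNNs.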